Data-Driven Learning of Feedforward Neural Networks with Different Activation Functions

Authors

Abstract

This work contributes to the development of a new data-driven method (D-DM) for learning feedforward neural networks (FNNs). D-DM was proposed recently as a way of improving randomized learning of FNNs by adjusting the network parameters to the fluctuations of the target function. The method employs logistic sigmoid activation functions for the hidden nodes. In this study, we introduce other activation functions, such as the bipolar sigmoid, the sine function, the saturating linear function, ReLU, and softplus. We derive formulas for their parameters, i.e. weights and biases. In a simulation study, we evaluate the performance of FNNs with the different activation functions. The results indicate that some functions perform much better than others in the approximation of complex, fluctuated target functions.
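The abstract names five alternative hidden-node activation functions. As a minimal sketch, the functions themselves can be written as follows (NumPy implementations; the D-DM formulas for the corresponding weights and biases are given in the paper itself, and the saturation bounds of the saturating linear function are an assumption here):

```python
import numpy as np

def logistic_sigmoid(x):
    # baseline activation used by the original D-DM
    return 1.0 / (1.0 + np.exp(-x))

def bipolar_sigmoid(x):
    # rescaled logistic sigmoid with range (-1, 1)
    return 2.0 / (1.0 + np.exp(-x)) - 1.0

def sine(x):
    return np.sin(x)

def satlin(x):
    # saturating linear function; clipping to [-1, 1] is an assumption
    return np.clip(x, -1.0, 1.0)

def relu(x):
    return np.maximum(0.0, x)

def softplus(x):
    # smooth approximation of ReLU; log1p improves numerical stability
    return np.log1p(np.exp(x))
```

In a randomized single-hidden-layer FNN, any of these functions would be applied elementwise to the hidden-layer pre-activations before the output weights are fitted.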


Similar resources

Learning in Feedforward Networks with Nonsmooth Functions

This paper is concerned with the problem of learning in networks where some or all of the functions involved are not smooth. Examples of such networks are those whose neural transfer functions are piecewise-linear and those whose error function is defined in terms of the ℓ∞ norm. Up to now, networks whose neural transfer functions are piecewise-linear have received very little consideration in...


Learning Stochastic Feedforward Neural Networks

Multilayer perceptrons (MLPs) or neural networks are popular models used for nonlinear regression and classification tasks. As regressors, MLPs model the conditional distribution of the predictor variables Y given the input variables X. However, this predictive distribution is assumed to be unimodal (e.g. Gaussian). For tasks involving structured prediction, the conditional distribution should...


Corruption of Generalizing Signals in Densely Connected Feedforward Neural Networks with Hyperbolic Tangent Activation Functions

This paper discusses the propagation of signals in generic densely connected multilayered feedforward neural networks. It is concluded that the dense connectivity, combined with the hyperbolic tangent activation functions of the neurons, may cause highly random, spurious generalization that decreases the overall performance and reliability of a neural network and can be mistaken for overfitting...


Fast Feedforward Neural Networks with Diffused Nonlinear Weight Functions

In this paper, feedforward neural networks are presented that have nonlinear weight functions based on look-up tables, which are specially smoothed in a regularization called the diffusion. The idea behind this type of network is the hypothesis that a greater number of adaptive parameters per weight function might reduce the total number of weight functions needed to solve a giv...


Distributed learning algorithm for feedforward neural networks

With the appearance of huge data sets, new challenges have arisen regarding the scalability and efficiency of machine learning algorithms, and both distributed computing and randomized algorithms have become effective ways to handle them. Taking advantage of these two approaches, a distributed learning algorithm for two-layer neural networks is proposed. Results demonstrate a similar accuracy whe...


Journal

Journal title: Lecture Notes in Computer Science

Year: 2021

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-030-87986-0_6